
    Application of DInSAR-GPS optimization for derivation of fine-scale surface motion maps of Southern California

    A method based on random field theory and the Gibbs-Markov random field equivalence within a Bayesian statistical framework is used to derive 3-D surface-motion maps from sparse Global Positioning System (GPS) measurements and a differential interferometric synthetic aperture radar (DInSAR) interferogram in the southern California region. The minimization of the Gibbs energy function is performed analytically, which is possible when neighboring pixels are considered independent. The problem is well posed, and the solution is unique, stable, and not biased by the continuity condition. The technique produces a 3-D field containing estimates of surface motion on the spatial scale of the DInSAR image, over a given time period, complete with error estimates. Significant improvement in the accuracy of the vertical component and moderate improvement in the accuracy of the horizontal components of velocity are achieved in comparison with the GPS data alone. The method can be expanded to account for other available data sets, such as additional interferograms, lidar, or leveling data, in order to achieve even higher accuracy.
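    When neighboring pixels are treated as independent, the analytic minimization of the Gibbs energy at each pixel amounts to a Bayesian (weighted least-squares) update of a GPS-derived velocity prior with a single line-of-sight (LOS) observation. The sketch below illustrates that per-pixel update; the function name, geometry, and all numbers are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    # Hypothetical per-pixel fusion of an interpolated GPS 3-D velocity prior
    # with one DInSAR line-of-sight (LOS) observation. With independent pixels,
    # the Gibbs-energy minimum has this closed-form Bayesian solution.

    def fuse_pixel(v_gps, cov_gps, los_unit, d_los, var_los):
        """Posterior 3-D velocity and covariance for one pixel.

        v_gps    : (3,) prior velocity (east, north, up) from interpolated GPS
        cov_gps  : (3, 3) prior covariance
        los_unit : (3,) unit vector from ground toward the satellite
        d_los    : scalar DInSAR LOS rate
        var_los  : variance of the LOS observation
        """
        A = los_unit.reshape(1, 3)              # observation operator
        P = np.linalg.inv(cov_gps)              # prior precision
        P_post = P + A.T @ A / var_los          # posterior precision
        cov_post = np.linalg.inv(P_post)
        v_post = cov_post @ (P @ v_gps + los_unit * d_los / var_los)
        return v_post, cov_post

    # Toy pixel: the LOS datum mainly tightens the poorly known vertical term.
    v, C = fuse_pixel(
        v_gps=np.array([10.0, 5.0, 0.0]),       # mm/yr
        cov_gps=np.diag([1.0, 1.0, 9.0]),       # large vertical uncertainty
        los_unit=np.array([0.38, -0.08, 0.92]), # typical SAR viewing geometry
        d_los=2.0,                              # mm/yr along LOS
        var_los=1.0,
    )
    ```

    Because the LOS vector is dominated by its vertical component, the posterior vertical variance shrinks the most, consistent with the abstract's reported improvement pattern.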

    Analytical Optimization of a DInSAR and GPS Dataset for Derivation of Three-Dimensional Surface Motion


    An Ising model for earthquake dynamics

    This paper focuses on extracting the information contained in seismic space-time patterns and their dynamics. The Greek catalog recorded from 1901 to 1999 is analyzed. An Ising cellular automaton representation technique is developed to reconstruct the history of these patterns. We find that there is strong correlation in the region, and that small earthquakes are very important to stress transfer. Finally, it is demonstrated that this approach is useful for seismic hazard assessment and intermediate-range earthquake forecasting.
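    The core of an Ising-style representation is mapping gridded seismicity in each time window to spins of +1 (active cell) or -1 (quiet cell) and then measuring spin-spin correlation. A minimal sketch, with illustrative thresholds and synthetic counts rather than the Greek catalog:

    ```python
    import numpy as np

    # Map event counts per grid cell to Ising spins, then compute the
    # nearest-neighbor spin correlation as a simple proxy for spatial
    # coupling in the seismicity pattern. Threshold and grid are assumptions.

    def to_spins(counts, threshold=1):
        """Cells with >= threshold events become +1, all others -1."""
        return np.where(counts >= threshold, 1, -1)

    def neighbor_correlation(spins):
        """Mean product of horizontally and vertically adjacent spins."""
        horiz = spins[:, :-1] * spins[:, 1:]
        vert = spins[:-1, :] * spins[1:, :]
        return (horiz.sum() + vert.sum()) / (horiz.size + vert.size)

    rng = np.random.default_rng(0)
    counts = rng.poisson(0.8, size=(20, 20))  # synthetic event counts per cell
    spins = to_spins(counts)
    c = neighbor_correlation(spins)           # lies in [-1, 1]
    ```

    A correlation near zero indicates spatially uncorrelated activity; persistent positive values over many windows would suggest the kind of regional coupling the abstract reports.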

    Using earthquake intensities to forecast earthquake occurrence times

    It is well known that earthquakes do not occur randomly in space and time. Foreshocks, aftershocks, precursory activation, and quiescence are just some of the patterns recognized by seismologists. Using the Pattern Informatics technique along with relative intensity analysis, we create a scoring method based on time-dependent relative operating characteristic diagrams and show that the occurrences of large earthquakes in California correlate with time intervals where fluctuations in small earthquakes are suppressed relative to the long-term average. We estimate a probability of less than 1% that this coincidence is due to random clustering. Furthermore, we show that the methods used to obtain these results may be applicable to other parts of the world.

    Improved Real-Time Natural Hazard Monitoring Using Automated DInSAR Time Series

    As part of the collaborative GeoSciFramework project, we are establishing a monitoring system for the Yellowstone volcanic area that integrates multiple geodetic and seismic data sets into an advanced cyber-infrastructure framework that will enable real-time streaming data analytics and machine learning and allow us to better characterize associated long- and short-term hazards. The goal is to continuously ingest both remote sensing (GNSS, DInSAR) and ground-based (seismic, thermal and gas observations, strainmeter, tiltmeter and gravity measurements) data and query and analyse them in near-real time. In this study, we focus on DInSAR data processing and the effects of various atmospheric corrections and real-time orbits on the automated processing and results. We find that the atmospheric correction provided by the European Centre for Medium-Range Weather Forecasts (ECMWF) is currently the best suited for automated DInSAR processing and that the use of real-time orbits is sufficient for the early-warning application in question. We show analysis of atmospheric corrections and real-time orbits in a test case over the Kilauea volcanic area in Hawaii. Finally, using these findings, we present displacement time series in the Yellowstone area between May 2018 and October 2019, which are in good agreement with GNSS data where available. These results will contribute to a baseline model for a future early-warning system that will be continuously updated with new DInSAR data acquisitions.

    Earthquake forecasting and its verification

    No proven method is currently available for the reliable short-term prediction of earthquakes (minutes to months). However, it is possible to make probabilistic hazard assessments for earthquake risk. In this paper we discuss a new approach to earthquake forecasting based on a pattern informatics (PI) method which quantifies temporal variations in seismicity. The output, which is based on an association of small earthquakes with future large earthquakes, is a map of areas in a seismogenic region ('hotspots') where earthquakes are forecast to occur in a future 10-year time span. This approach has been successfully applied to California, to Japan, and on a worldwide basis. Because a sharp decision threshold is used, these forecasts are binary: an earthquake is forecast either to occur or not to occur. The standard approach to the evaluation of a binary forecast is the relative (or receiver) operating characteristic (ROC) diagram, which is a more restrictive test and less subject to bias than maximum likelihood tests. To test our PI method, we made two types of retrospective forecasts for California. The first is the PI method and the second is a relative intensity (RI) forecast based on the hypothesis that future large earthquakes will occur where most smaller earthquakes have occurred in the recent past. While both retrospective forecasts are for the ten-year period 1 January 2000 to 31 December 2009, we performed an interim analysis 5 years into the forecast. The PI method outperforms the RI method under most circumstances.
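    The ROC evaluation described above can be sketched in a few lines: rank cells by forecast score (PI or RI), sweep the decision threshold, and record the hit rate against the false-alarm rate. The synthetic scores and seed below are illustrative assumptions, not forecast data from the paper.

    ```python
    import numpy as np

    # ROC evaluation of a binary forecast: sweep the decision threshold over
    # ranked forecast scores and trace hit rate vs. false-alarm rate.

    def roc_curve(scores, occurred):
        """False-alarm rate and hit rate at every threshold.

        scores   : (n,) forecast score per cell (higher = more hazardous)
        occurred : (n,) boolean, True where a large earthquake occurred
        """
        order = np.argsort(scores)[::-1]       # most hazardous cells first
        hits = np.cumsum(occurred[order])
        falses = np.cumsum(~occurred[order])
        return falses / (~occurred).sum(), hits / occurred.sum()

    def roc_area(false_rate, hit_rate):
        """Trapezoidal area under the ROC curve; 0.5 = random, 1.0 = perfect."""
        return float(np.sum(np.diff(false_rate) * (hit_rate[1:] + hit_rate[:-1]) / 2))

    # Synthetic test: a skillful score correlates with occurrence, plus noise.
    rng = np.random.default_rng(1)
    occurred = rng.random(500) < 0.05
    scores = occurred + rng.normal(0.0, 0.5, size=500)
    fr, hr = roc_curve(scores, occurred)
    auc = roc_area(fr, hr)                     # well above 0.5 for a skillful score
    ```

    Comparing the areas (or the full curves) for PI-ranked versus RI-ranked cells is one way to quantify the relative performance the abstract reports.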

    Gravity changes from a stress evolution earthquake simulation of California

    The gravity signal contains information regarding changes in density at all depths and can be used as a proxy for strain accumulation in fault networks. A time-dependent stress evolution model was used to create simulated slip histories over the San Andreas Fault network in California. Using a linear sum of the gravity signals from each fault segment in the model, via coseismic gravity Green's functions, a time-dependent gravity model was created. The steady-state gravity from long-term plate motion generates a signal over 5 years with magnitudes of ±~2 μGal, the current limit of portable instrument observations. Moderate to large events generate signal magnitudes in the range of ~10 to ~80 μGal, well within the range of ground-based observations. The complex fault network geometry of California significantly affects the spatial extent of the gravity signal from the three events studied.
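    The linear-superposition step described above reduces to a matrix product: the gravity change at a station is a sum of precomputed coseismic Green's functions, one per fault segment, scaled by each segment's simulated slip history. A minimal sketch with illustrative sizes and random values standing in for the model outputs:

    ```python
    import numpy as np

    # Station gravity time series as a linear sum over fault segments.
    # G and slip are placeholders for the model's Green's functions and
    # simulated slip histories; all magnitudes here are illustrative.

    n_segments, n_times = 4, 6
    rng = np.random.default_rng(2)

    # G[i]: gravity change at the station per unit slip on segment i (µGal/m)
    G = rng.normal(0.0, 5.0, size=n_segments)

    # slip[i, t]: cumulative slip on segment i at time t (m)
    slip = np.cumsum(rng.exponential(0.02, size=(n_segments, n_times)), axis=1)

    # Linear superposition: one gravity value (µGal) per time step.
    gravity = G @ slip
    ```

    Because the sum is linear, additional segments or longer slip histories only extend the matrices; the Green's functions themselves are computed once and reused for every time step.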

    Earthquake Statistics in Models and Data
